telepresence system
Deformation of the panoramic sphere into an ellipsoid to induce self-motion in telepresence users
Laukka, Eetu, Center, Evan G., Ojala, Timo, LaValle, Steven M., Pouke, Matti
Mobile telepresence robots allow users to feel present in and explore remote environments. Traditionally, these systems are implemented with a camera mounted on a remotely controlled mobile robot. Although high-immersion technologies, such as 360-degree cameras, can increase situational awareness and presence, they also introduce significant challenges: the additional processing and bandwidth requirements often result in latencies of up to several seconds. The delay currently incurred when streaming 360-degree video over the internet makes real-time control of these systems difficult, so users of high-latency systems require some form of assistance. This study presents a novel way to utilize optical flow to create an illusion of self-motion for the user during the latency period between the user sending motion commands to the robot and seeing the actual motion through the 360-degree camera stream. We find no significant benefit of the self-motion illusion for the performance or accuracy of controlling a telepresence robot with a latency of 500 ms, as measured by task completion time and collisions with objects. There is some evidence that the method may increase virtual reality (VR) sickness, as measured by the simulator sickness questionnaire (SSQ). We conclude that further adjustments are necessary to render the method viable.
- Europe > Finland > Northern Ostrobothnia > Oulu (0.06)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Europe > Netherlands > South Holland > Leiden (0.04)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study > Negative Result (0.54)
- Transportation (0.47)
- Government (0.37)
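The sphere-to-ellipsoid deformation named in the title can be sketched geometrically. This is only an illustrative reconstruction: the linear stretch law and the gain `k` below are assumptions, not the paper's exact formulation.

```python
import numpy as np

def deform_to_ellipsoid(vertices, motion_dir, speed, dt, k=0.2):
    """Stretch a unit panoramic sphere into an ellipsoid along the motion axis.

    vertices   : (N, 3) array of unit vectors, the panoramic sphere mesh
    motion_dir : commanded motion direction (any nonzero 3-vector)
    speed      : commanded speed, scaled by the assumed tuning gain k
    dt         : time elapsed since the command was sent (the latency gap)

    Elongating the projection surface along the commanded direction shifts
    the rendered panorama texture each frame, producing optical flow that
    suggests self-motion while the real 360-degree stream is still in transit.
    """
    d = np.asarray(motion_dir, dtype=float)
    d /= np.linalg.norm(d)
    a = 1.0 + k * speed * dt              # semi-axis along the motion direction
    proj = vertices @ d                   # signed component of each vertex along d
    # Ellipsoid with semi-axis a along d and unit semi-axes orthogonal to d:
    return vertices + (a - 1.0) * proj[:, None] * d
```

In use, `dt` would grow frame by frame until the real camera stream reflecting the motion arrives, at which point the deformation resets to a sphere.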
Teledrive: An Embodied AI based Telepresence System
Banerjee, Snehasis, Paul, Sayan, Roychoudhury, Ruddradev, Bhattacharya, Abhijan, Sarkar, Chayan, Sau, Ashis, Pramanick, Pradip, Bhowmick, Brojeshwar
This article presents Teledrive, a telepresence robotic system with embodied AI features that empowers an operator to navigate the telerobot in any unknown remote place with minimal human intervention. We conceive Teledrive in the context of democratizing remote caregiving for elderly citizens as well as for isolated patients affected by contagious diseases. In particular, this paper focuses on the problem of navigating to a rough target area (such as a bedroom or kitchen) rather than to pre-specified point destinations. This ushers in a unique AreaGoal-based navigation feature, which has not been explored in depth in contemporary solutions. Further, we describe an edge-computing-based software system built on a WebRTC-based communication framework that realizes the aforementioned scheme through easy-to-use, speech-based human-robot interaction. To enhance the ease of operation for the remote caregiver, we incorporate a person-following feature, whereby the robot follows a person moving about the premises as directed by the operator. Moreover, the presented system is loosely coupled with specific robot hardware, unlike existing solutions. We have evaluated the efficacy of the proposed system through baseline experiments, a user study, and real-life deployment.
- North America > United States > New York > New York County > New York City (0.04)
- Europe > Austria (0.04)
- Asia > Singapore (0.04)
- Questionnaire & Opinion Survey (0.87)
- Research Report (0.82)
- Leisure & Entertainment (0.88)
- Information Technology (0.67)
- Health & Medicine > Therapeutic Area (0.46)
- Media > Film (0.34)
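The AreaGoal abstraction described above (navigate to "bedroom" rather than to a coordinate) can be illustrated with a toy resolver. The room map and the clamp-to-box rule are hypothetical, meant only to show how an area goal reduces to a point goal; this is not Teledrive's actual planner.

```python
# Hypothetical room map: axis-aligned bounding boxes (xmin, ymin, xmax, ymax)
# in the robot's map frame.
ROOMS = {
    "kitchen": (0.0, 0.0, 3.0, 4.0),
    "bedroom": (5.0, 0.0, 9.0, 3.5),
}

def area_goal(room, robot_xy):
    """Resolve an AreaGoal label into a point goal for a point navigator.

    The chosen point is the closest point of the room's bounding box to the
    robot's current position; fine positioning inside the area is left to
    the local planner once the robot has entered the room.
    """
    xmin, ymin, xmax, ymax = ROOMS[room]
    x = min(max(robot_xy[0], xmin), xmax)  # clamp x into the box
    y = min(max(robot_xy[1], ymin), ymax)  # clamp y into the box
    return (x, y)
```

A speech front end, as described in the abstract, would map an utterance like "go to the bedroom" to `area_goal("bedroom", current_pose)` and hand the result to the point-goal navigator.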
Team Northeastern's Approach to ANA XPRIZE Avatar Final Testing: A Holistic Approach to Telepresence and Lessons Learned
Luo, Rui, Wang, Chunpeng, Keil, Colin, Nguyen, David, Mayne, Henry, Alt, Stephen, Schwarm, Eric, Mendoza, Evelyn, Padır, Taşkın, Whitney, John Peter
This paper reports on Team Northeastern's Avatar system for telepresence and our holistic approach to meeting the ANA Avatar XPRIZE Final testing task requirements. The system features a dual-arm configuration with a hydraulically actuated glove-gripper pair for haptic force feedback. Our Avatar system was evaluated in the ANA Avatar XPRIZE Finals, where it completed all 10 tasks, scored 14.5 points out of 15.0, and received the 3rd Place Award. We provide details of the improvements over our first-generation Avatar, covering manipulation, perception, locomotion, power, network, and controller design. We also extensively discuss the major lessons learned during our participation in the competition.
- Research Report (1.00)
- Contests & Prizes (0.83)
- Information Technology > Communications > Networks (1.00)
- Information Technology > Artificial Intelligence > Robots (1.00)
VR-controlled robots are being designed to treat injured soldiers
If you think of robots in the military, your mind may conjure dystopian images of science-fiction battlefields where AI-powered machines trade laser fire. But in a much more humane application, UK researchers are developing a potentially lifesaving medical system, the equivalent of a VR triage video call. University of Sheffield researchers are working on a telepresence system for treating military personnel during combat. The plan is for offsite medics to don virtual reality headsets and control a battlefield robot, a machine that can take the patient's vitals using the same technology found in robotic surgery.
- Health & Medicine (1.00)
- Government > Military > Army (1.00)
Patel
We introduce the Beam, a collaborative autonomous mobile service robot based on SuitableTech's Beam telepresence system. We present a set of enhancements to the telepresence system, including autonomy, human awareness, increased computation and sensing capabilities, and integration with the popular Robot Operating System (ROS) framework. Together, our improvements transform the Beam into a low-cost platform for research on service robots. We examine the Beam on target search and object delivery tasks and demonstrate that the robot achieves a 100% success rate.
ISS Astronauts Operating Remote Robots Show Future of Planetary Exploration
In late August, an astronaut on board the International Space Station remotely operated a humanoid robot to inspect and repair a solar farm on Mars, or at least a simulated Mars environment: a room with rust-colored floors, walls, and curtains at the German Aerospace Center (DLR) in Oberpfaffenhofen, near Munich. European Space Agency astronaut Paolo Nespoli commanded the humanoid, called Rollin' Justin, as the robot performed a series of navigation, maintenance, and repair tasks. Instead of relying on direct teleoperation, Nespoli used a tablet computer to issue high-level commands to the robot. In one task, he used the tablet to position the robot and have it take pictures from different angles. Another command instructed Justin to grasp a cable and connect it to a data port.
- Government > Space Agency (0.73)
- Energy > Renewable > Solar (0.38)